Quadratic Mutual Information Feature Selection


Similar Articles

Quadratic Mutual Information Feature Selection

We propose a novel feature selection method based on quadratic mutual information, which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method uses a direct estimate of quadratic mutual information from data samples using Gaussian kernel functions, and can detect second-order non-linear relations. Its main advantages are: (i) unified analysis of discrete and continuous dat...
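
The estimator described above can be illustrated with a small Parzen-window sketch. This follows the standard Cauchy–Schwarz QMI formulation from information-theoretic learning for a single continuous feature and discrete class labels; it is not the authors' implementation, and the function names and kernel bandwidth `sigma` are our own choices:

```python
import numpy as np

def gauss(d2, sigma):
    # 1-D Gaussian kernel evaluated at squared distances d2
    return np.exp(-d2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def qmi_cs(x, y, sigma=0.5):
    """Cauchy-Schwarz QMI between a continuous feature x and discrete
    labels y, via Parzen density estimates with Gaussian kernels."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y)
    n = len(x)
    # The convolution of two sigma-kernels is a sqrt(2)*sigma kernel,
    # so all pairwise density integrals reduce to one kernel matrix.
    K = gauss((x[:, None] - x[None, :]) ** 2, np.sqrt(2.0) * sigma)
    same = y[:, None] == y[None, :]
    classes, counts = np.unique(y, return_counts=True)
    p_c = counts / n
    V_J = K[same].sum() / n**2                # integral of p(x,y)^2
    V_M = (K.sum() / n**2) * (p_c**2).sum()   # integral of (p(x)p(y))^2
    w = p_c[np.searchsorted(classes, y)]      # prior of each sample's class
    V_C = (w[:, None] * K).sum() / n**2       # cross term p(x,y)*p(x)p(y)
    return np.log(V_J * V_M / V_C**2)
```

By the Cauchy–Schwarz inequality the value is non-negative, and larger values indicate stronger (possibly non-linear) dependence between the feature and the labels.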

Feature Evaluation using Quadratic Mutual Information

Methods of feature evaluation are developed and discussed based on information-theoretic learning (ITL). Mutual information has been shown in the literature to be more robust and precise for evaluating a feature set. In this paper, we propose to use quadratic mutual information (QMI) for feature evaluation. The concept of information potential gives a clearer physical meaning to the evaluatio...

On Estimating Mutual Information for Feature Selection

Mutual Information (MI) is a powerful concept from information theory used in many application fields. For practical tasks it is often necessary to estimate the mutual information from available data. We compare state-of-the-art methods for estimating MI from continuous data, focusing on their usefulness for the feature selection task. Our results suggest that many methods are practically relevan...
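
For concreteness, the simplest baseline that such comparisons typically start from is the plug-in estimate computed from a 2-D histogram. The sketch below is our own illustrative baseline (the bin count is an arbitrary choice), not one of the estimators evaluated in the paper:

```python
import numpy as np

def mi_hist(x, y, bins=16):
    """Plug-in mutual information estimate (in nats) of two continuous
    variables, from a 2-D histogram of the samples."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                         # joint probabilities per cell
    px = pxy.sum(axis=1, keepdims=True)      # marginal of x
    py = pxy.sum(axis=0, keepdims=True)      # marginal of y
    nz = pxy > 0                             # skip empty cells (0*log0 = 0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

The plug-in estimate is biased upward for finite samples, which is one reason such comparisons of more refined estimators (e.g. nearest-neighbour based) are needed in practice.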

Binary Feature Selection with Conditional Mutual Information

In a context of classification, we propose to use conditional mutual information to select a family of binary features which are individually discriminating and weakly dependent. We show that on a task of image classification, despite its simplicity, a naive Bayesian classifier based on features selected with this Conditional Mutual Information Maximization (CMIM) criterion performs as well as a c...
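
Following the published CMIM criterion, the selection loop picks, at each step, the candidate whose conditional mutual information with the class is largest in the worst case over the features already chosen. The sketch below is our own minimal re-implementation for discrete features (helper names are ours; it is not the authors' code):

```python
import numpy as np

def mi(a, b):
    """Empirical mutual information (nats) of two discrete arrays."""
    vals, counts = np.unique(np.stack([a, b]), axis=1, return_counts=True)
    pab = counts / counts.sum()
    pa = {v: (a == v).mean() for v in np.unique(a)}
    pb = {v: (b == v).mean() for v in np.unique(b)}
    return float(sum(p * np.log(p / (pa[va] * pb[vb]))
                     for (va, vb), p in zip(vals.T, pab)))

def cmi(y, f, s):
    """Conditional mutual information I(Y; F | S) over the values of S."""
    return float(sum((s == v).mean() * mi(y[s == v], f[s == v])
                     for v in np.unique(s)))

def cmim(X, y, k):
    """Greedy CMIM: keep features that stay informative about y even
    conditioned on every feature selected so far."""
    selected = []
    for _ in range(k):
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            # Worst case over already-selected features; plain MI when none yet.
            score = min((cmi(y, X[:, j], X[:, s]) for s in selected),
                        default=mi(y, X[:, j]))
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

The min over already-selected features is what makes the chosen family weakly dependent: a feature that merely duplicates an earlier one has near-zero conditional MI given it, and is skipped.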

Feature selection using Joint Mutual Information Maximisation

Feature selection is used in many application areas relevant to expert and intelligent systems, such as data mining and machine learning, image processing, anomaly detection, bioinformatics and natural language processing. Feature selection based on information theory is a popular approach due to its computational efficiency, scalability in terms of the dataset dimensionality, and independence fro...


Journal

Journal title: Entropy

Year: 2017

ISSN: 1099-4300

DOI: 10.3390/e19040157